The computational face for facial emotion analysis: Computer based emotion analysis from the face
Facial expressions are considered to be the most revealing way of understanding the human psychological state during face-to-face communication. It is believed that a more natural interaction between humans and machines can be undertaken through the detailed understanding of the different facial expressions which imitate the manner by which humans communicate with each other.
In this research, we study the different aspects of facial emotion detection, analysis and investigate possible hidden identity clues within the facial expressions. We study a deeper aspect of facial expressions whereby we try to identify gender and human identity - which can be considered as a form of emotional biometric - using only the dynamic characteristics of the smile expressions. Further, we present a statistical model for analysing the relationship between facial features and Duchenne (real) and non-Duchenne (posed) smiles. Thus, we identify that the expressions in the eyes contain discriminating features between Duchenne and non-Duchenne smiles.
Our results indicate that facial expressions can be identified through facial movement analysis models, achieving an accuracy rate of 86% for classifying the six universal facial expressions and 94% for classifying the common 18 facial action units. Further, we successfully identify gender using only the dynamic characteristics of the smile expression, obtaining an 86% classification rate. Likewise, we present a framework to study the possibility of using the smile as a biometric, whereby we show that the human smile is unique and stable.
A genuine smile is indeed in the eyes – The computer aided non-invasive analysis of the exact weight distribution of human smiles across the face
Understanding the detailed differences between posed and spontaneous smiles is an important topic with a range of applications such as in human-computer interaction, automatic facial emotion analysis and in awareness systems. During the past decade or so, there have been very promising solutions for accurate automatic recognition and detailed facial emotion analysis. To this end, many methods and techniques have been proposed for distinguishing between spontaneous and posed smiles. Our aim here is to go beyond the present state of the art in this field. Hence, in this work, we are concerned with understanding the exact distribution of a smile – both spontaneous and posed – across the face. To do this, we utilise a lightweight computational framework which we have developed to analyse the dynamics of human facial expressions. We utilise this framework to undertake a detailed study of the smile expression. Based on computing the optical flow across the face – especially across key parts of the face such as the mouth, the cheeks and around the eyes – we are able to accurately map the dynamic weight distribution of the smile expression. To validate our computational model, we utilise two publicly available datasets, namely the CK+ dataset in which the subjects express posed smiles and the MUG dataset in which the subjects express genuine smiles. Our results not only confirm what already exists in the literature – i.e. that the spontaneous genuine smile is truly in the eyes – but also give further insight into the exact distribution of the smile across the face.
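The regional weight-distribution idea described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes per-frame mean optical-flow magnitudes have already been computed for each facial region, and the region names and flow values below are hypothetical placeholders.

```python
# Sketch: aggregate optical-flow magnitude per facial region to obtain
# the relative "weight" of a smile across the face. The flow values
# below are hypothetical placeholders, not measurements from the paper.

def smile_weight_distribution(region_flow):
    """Map each region to its share of the total flow magnitude.

    region_flow maps a region name to a list of per-frame mean
    optical-flow magnitudes captured during the smile.
    """
    totals = {region: sum(mags) for region, mags in region_flow.items()}
    grand_total = sum(totals.values())
    # Normalise so the weights across all regions sum to 1.
    return {region: t / grand_total for region, t in totals.items()}

# Hypothetical per-frame flow magnitudes for three key regions.
flow = {
    "mouth": [0.9, 1.4, 1.6, 1.1],
    "cheeks": [0.3, 0.5, 0.6, 0.4],
    "eyes": [0.2, 0.4, 0.5, 0.3],
}
weights = smile_weight_distribution(flow)
```

Comparing the `eyes` weight between posed and spontaneous smiles is then one way such a distribution could expose the Duchenne difference the abstract refers to.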
Is gender encoded in the smile? A computational framework for the analysis of the smile driven dynamic face for gender recognition
Automatic gender classification has become a topic of great interest to the visual computing research community in recent
times. This is due to the fact that computer-based automatic gender recognition has multiple applications including, but not
limited to, face perception, age, ethnicity, identity analysis, video surveillance and smart human computer interaction. In this
paper, we discuss a machine learning approach for efficient identification of gender purely from the dynamics of a person’s
smile. Thus, we show that the complex dynamics of a smile on someone’s face bear much relation to the person’s gender.
To do this, we first formulate a computational framework that captures the dynamic characteristics of a smile. Our dynamic
framework measures changes in the face during a smile using a set of spatial features on the overall face, the area of the
mouth, the geometric flow around prominent parts of the face and a set of intrinsic features based on the dynamic geometry
of the face. This enables us to extract 210 distinct dynamic smile parameters which form the contributing features for
machine learning. For machine classification, we have utilised both the Support Vector Machine and the k-Nearest Neighbour
algorithms. To verify the accuracy of our approach, we have tested our algorithms on two databases, namely the CK+ and the
MUG, consisting of a total of 109 subjects. As a result, using the k-NN algorithm, along with ten-fold cross-validation, for
example, we achieve an accurate gender classification rate of over 85%. Hence, through the methodology we present here,
we establish proof of the existence of strong indicators of gender dimorphism, purely in the dynamics of a person's smile.
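The k-NN classification step described in this abstract can be illustrated with a toy example. This is a sketch under stated assumptions: the two-dimensional feature vectors and labels below are synthetic stand-ins for the 210 dynamic smile parameters, not data from the CK+ or MUG datasets.

```python
import math
from collections import Counter

# Sketch of k-NN gender classification over dynamic smile features.
# The training vectors and labels are synthetic stand-ins for the
# 210 smile parameters described in the abstract.

def knn_predict(train, labels, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training vectors under Euclidean distance."""
    dists = sorted(
        (math.dist(x, query), y) for x, y in zip(train, labels)
    )
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

# Two synthetic clusters of smile-dynamics features.
train = [(0.10, 0.20), (0.20, 0.10), (0.15, 0.25),
         (0.80, 0.90), (0.90, 0.80), (0.85, 0.95)]
labels = ["female", "female", "female", "male", "male", "male"]

pred = knn_predict(train, labels, (0.12, 0.18), k=3)
```

In practice one would also run the ten-fold cross-validation the abstract mentions, repeatedly holding out a tenth of the subjects and averaging the classification rate over the ten folds.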
The biometric characteristics of a smile
Facial expressions have been studied for their diagnostic capabilities in mental health and for clues to longevity, gender and other personality traits. The use of facial expressions, especially the expression of the smile, as a biometric has not been examined in great detail. However, research shows that a person can be identified from their behavioural traits, including their emotional expressions. In this chapter, we discuss a novel computational biometric model which can be derived from the smile expression. We discuss how the temporal components of a smile can be utilised to show that similarities in the smile exist for an individual, and how these can be used to create a tool which can serve as a biometric.
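One simple way to compare two smiles for biometric matching is the cosine similarity of their dynamic parameter vectors. This is a hypothetical illustration of that idea, not the model from the chapter: the three-element vectors and the notion of a match threshold are invented for the example.

```python
import math

# Sketch: comparing two smiles by the cosine similarity of their
# dynamic parameter vectors. The vectors below are hypothetical
# three-parameter summaries, not the chapter's actual features.

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

smile_a = [0.90, 0.40, 0.20]   # person X, first recording
smile_b = [0.85, 0.42, 0.25]   # person X, second recording
smile_c = [0.10, 0.80, 0.90]   # a different person

same_person = cosine_similarity(smile_a, smile_b)
other_person = cosine_similarity(smile_a, smile_c)
```

If an individual's smile is as stable as the chapter argues, repeated recordings of the same person should score consistently higher than cross-person comparisons, which is what makes a threshold-based biometric decision possible.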
Gender and smile dynamics
This chapter is concerned with the discussion of a computational framework to aid with automated gender classification using the dynamics of a smile. The computational smile dynamics framework we discuss here uses the spatio-temporal changes on the face during a smile. Specifically, it uses a set of spatial and temporal features on the overall face. These include the changes in the area of the mouth, the geometric flow around facial features and a set of intrinsic features over the face. These features are explicitly derived from the dynamics of the smile. Based on these, a number of distinct dynamic smile parameters can be extracted which can then be fed to a machine learning algorithm for gender classification.
Macronutrients Intake and Risk of Stomach Cancer: Findings from Case-Control Study
Studies on the association between gastric cancer (GC) and the intake of nutrients in Jordan are very limited, while findings from other reports on the intake of energy and macronutrients are controversial. This study aimed to examine the associations between intake of energy and macronutrients and the risk of GC in a Jordanian population. A case-control study was carried out between March 2015 and August 2018 in four major hospitals, including an oncology center in Jordan. Study participants were 173 cases with incident and histologically confirmed GC and 314 frequency-matched controls. Interview-based questionnaires were used to obtain the study information. Data on nutrient intake were collected using a validated Arabic food-frequency questionnaire (FFQ). Odds ratios (ORs) and their corresponding 95% confidence intervals (CIs) were calculated through multinomial logistic regression and adjusted for potential confounders, including age, marital status, education, body mass index (BMI), smoking, period of smoking, family history of gastric cancer, history of gastric ulcer, and physical activity. Intakes of total fat, saturated fat, monounsaturated fat, polyunsaturated fat, cholesterol, trans-fat, and omega-6 fatty acids were significantly associated with increased risk of GC. The ORs for the highest versus the lowest tertiles were 6.47 (95% CI: 3.29–12.77), 2.97 (95% CI: 1.58–5.58), 6.84 (95% CI: 3.46–13.52), 6.19 (95% CI: 3.15–12.17), 3.05 (95% CI: 1.58–5.88), 8.11 (95% CI: 4.20–15.69), and 2.74 (95% CI: 1.47–5.09), respectively. No significant association was found for energy, protein, carbohydrate, sugar, fibers, and omega-3 fatty acids. The findings of this study suggest that high intake of selected types of fats was associated with an increased risk of GC. This research was funded by the Hashemite University [1403938/10/13/16AM].
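The odds-ratio estimates reported above come from adjusted multinomial logistic regression; as a simpler illustration of what an OR and its 95% CI mean, the crude (unadjusted) calculation for a single 2×2 exposure table can be sketched as follows. The counts below are invented for the example and are not the study's data.

```python
import math

def odds_ratio_ci(exposed_cases, exposed_controls,
                  unexposed_cases, unexposed_controls, z=1.96):
    """Crude odds ratio with a Wald 95% confidence interval,
    computed on the log-odds scale. Counts must be non-zero."""
    or_ = (exposed_cases * unexposed_controls) / (
        exposed_controls * unexposed_cases)
    # Standard error of log(OR) for a 2x2 table.
    se = math.sqrt(1 / exposed_cases + 1 / exposed_controls
                   + 1 / unexposed_cases + 1 / unexposed_controls)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: highest vs lowest tertile of some fat intake.
or_, lo, hi = odds_ratio_ci(exposed_cases=60, exposed_controls=70,
                            unexposed_cases=30, unexposed_controls=120)
```

A CI whose lower bound stays above 1, as in the study's significant fat-intake results, is what marks the association as statistically significant at the 5% level; the study's actual estimates additionally adjust for the listed confounders, which this crude sketch does not.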